Solve the differential equation
\[ \begin{equation} ( 1 - x^2 ) y^{\prime \prime} - x y^{\prime} + n^{2} y = 0 \end{equation} \]
Let \(x = \cos t\); then
\[ \begin{align} \frac{dx}{dt} &= -\sin t \\ y^{\prime} &= \frac{dy}{dx} = \frac{dt}{dx} \frac{dy}{dt} = - \frac{1}{\sin t} \frac{dy}{dt} \\ y^{\prime \prime} &= \frac{d^{2}y}{dx^{2}} = \frac{d}{dx} \left( \frac{dy}{dx} \right) = \frac{dt}{dx} \frac{d}{dt} \left( -\frac{1}{\sin t} \frac{dy}{dt} \right) \\ &= \frac{1}{\sin^{2}t} \frac{d^{2}y}{dt^{2}} - \frac{\cos t}{\sin^{3}t} \frac{dy}{dt} \end{align} \]
Substituting these derivatives into the original equation, and noting that \(1 - x^{2} = \sin^{2} t\), yields the transformed equation
\[ \begin{equation} \frac{d^{2}y}{dt^{2}} + n^{2} y = 0 \end{equation} \]
This is solved by the method of the characteristic equation, \(r^{2} + n^{2} = 0\), giving
\[ \begin{align} y(t) &= C_{1} e^{int} + C_{2} e^{-int} \\ &= ( C_{1} + C_{2} ) \cos nt + i ( C_{1} - C_{2} ) \sin nt \\ &= A \cos nt + B \sin nt \end{align} \]
Taking the cosine part (\(B = 0\)) gives \(y(t) = C \cos nt\).
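As a quick numerical sanity check, the solution \(y = \cos(n \arccos x)\) (i.e. \(y(t) = \cos nt\) with \(x = \cos t\)) can be substituted back into the original equation using finite differences. This is a sketch; the helper names `y` and `residual` are mine, not from the text.

```python
import math

def y(x, n=4):
    # Candidate solution y(t) = cos(n t) with t = arccos(x).
    return math.cos(n * math.acos(x))

def residual(x, n=4, h=1e-5):
    # Central finite differences for y' and y'', then plug into
    # (1 - x^2) y'' - x y' + n^2 y, which should be ~0.
    yp = (y(x + h, n) - y(x - h, n)) / (2 * h)
    ypp = (y(x + h, n) - 2 * y(x, n) + y(x - h, n)) / h**2
    return (1 - x**2) * ypp - x * yp + n**2 * y(x, n)

for x in [-0.7, -0.2, 0.4, 0.8]:
    assert abs(residual(x)) < 1e-4
```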
The Chebyshev polynomial of the first kind then follows:
\[ \begin{align} T_{n}(x) &= \cos(n \arccos x), & x \in [-1, 1] \\ T_{n}(\cos t) &= \cos nt, & t \in [0, \pi] \end{align} \]
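Although defined through trigonometric functions, \(T_{n}\) really is a polynomial in \(x\). A minimal check against the explicit forms \(T_{2}(x) = 2x^{2} - 1\) and \(T_{3}(x) = 4x^{3} - 3x\) (the helper name `T` is mine):

```python
import math

def T(n, x):
    """Chebyshev polynomial of the first kind, trigonometric definition."""
    return math.cos(n * math.acos(x))

# Compare with the explicit polynomial forms on a few sample points.
for x in [-0.9, -0.3, 0.0, 0.5, 0.99]:
    assert abs(T(2, x) - (2 * x**2 - 1)) < 1e-12
    assert abs(T(3, x) - (4 * x**3 - 3 * x)) < 1e-12
```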
Similarly, the Chebyshev polynomial of the second kind is deduced in the same way, but from a slightly different equation:
\[ \begin{equation} ( 1 - x^2 ) y^{\prime \prime} - 3 x y^{\prime} + (n+2) n y = 0 \end{equation} \]
With the same substitution \(x = \cos t\), this equation becomes \(\frac{d^{2}y}{dt^{2}} + 2 \frac{\cos t}{\sin t} \frac{dy}{dt} + n(n+2) y = 0\). Multiplying by \(\sin t\) and setting \(w = y \sin t\) (so that \(\frac{d^{2}w}{dt^{2}} = \frac{d^{2}y}{dt^{2}} \sin t + 2 \frac{dy}{dt} \cos t - y \sin t\)) reduces it to \(\frac{d^{2}w}{dt^{2}} + (n+1)^{2} w = 0\), whose sine solution \(w = \sin((n+1)t)\) gives
\[ \begin{equation} y(t) = \frac{\sin((n+1)t)}{\sin t} \end{equation} \]
The Chebyshev polynomial of the second kind then follows:
\[ \begin{align} U_{n}(x) &= \frac{\sin((n+1)\arccos x)}{\sin(\arccos x)}, & x \in (-1, 1) \\ U_{n}(\cos t) &= \frac{\sin((n+1)t)}{\sin t}, & t \in (0, \pi) \end{align} \]
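\(U_{n}\) is likewise a polynomial in \(x\); for instance \(U_{1}(x) = 2x\) and \(U_{2}(x) = 4x^{2} - 1\). A minimal check (the helper name `U` is mine; \(x = \pm 1\) is avoided because the trigonometric form divides by \(\sin t\)):

```python
import math

def U(n, x):
    """Chebyshev polynomial of the second kind, trigonometric definition."""
    t = math.acos(x)
    return math.sin((n + 1) * t) / math.sin(t)  # valid for x in (-1, 1)

# Compare with the explicit polynomial forms on a few sample points.
for x in [-0.8, -0.2, 0.3, 0.7]:
    assert abs(U(1, x) - 2 * x) < 1e-12
    assert abs(U(2, x) - (4 * x**2 - 1)) < 1e-12
```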
From the product-to-sum identity, one of the recurrence formulas for the first kind follows:
\[ \begin{align} \cos(n-1)t + \cos(n+1)t &= 2 \cos nt \cos t \\ T_{n-1}(x) + T_{n+1}(x) &= 2 T_{n}(x) T_{1}(x) \end{align} \]
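Since \(T_{1}(x) = x\), this is the familiar three-term recurrence \(T_{n+1} = 2x T_{n} - T_{n-1}\). A minimal numerical check (helper name `T` is mine):

```python
import math

def T(n, x):
    return math.cos(n * math.acos(x))

# Verify T_{n-1}(x) + T_{n+1}(x) = 2 T_n(x) T_1(x), with T_1(x) = x.
for n in range(1, 8):
    for x in [-0.75, 0.1, 0.6]:
        lhs = T(n - 1, x) + T(n + 1, x)
        rhs = 2 * T(n, x) * T(1, x)
        assert abs(lhs - rhs) < 1e-12
```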
Similarly, from \(\sin(nt) + \sin((n+2)t) = 2 \sin((n+1)t) \cos t\), a recurrence formula for the second kind follows:
\[ \begin{align} U_{n-1}(x) + U_{n+1}(x) &= 2 x\, U_{n}(x) = U_{1}(x) U_{n}(x) \end{align} \]
Note that \(U_{1}(x) = 2x\), so the factor of 2 is absorbed; it is not \(2 U_{n} U_{1}\).
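A minimal numerical check of this recurrence (helper name `U` is mine):

```python
import math

def U(n, x):
    t = math.acos(x)
    return math.sin((n + 1) * t) / math.sin(t)

# Verify U_{n-1}(x) + U_{n+1}(x) = 2 x U_n(x) = U_1(x) U_n(x).
for n in range(1, 8):
    for x in [-0.6, 0.2, 0.85]:
        lhs = U(n - 1, x) + U(n + 1, x)
        assert abs(lhs - 2 * x * U(n, x)) < 1e-12
        # Since U_1(x) = 2x, the right-hand side equals U_1(x) * U_n(x).
        assert abs(lhs - U(1, x) * U(n, x)) < 1e-12
```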
Recall the angle-addition formulas and the orthogonality relations on \([-1, 1]\):
\[ \begin{align} \cos(\alpha + \beta) &= \cos\alpha \cos\beta - \sin\alpha \sin\beta \\ \sin(\alpha - \beta) &= \sin\alpha \cos\beta - \cos\alpha \sin\beta \\ \int_{-1}^{1} \cos(m \pi x) \cos(n \pi x) dx &= 0, & m \neq n \\ \int_{-1}^{1} \sin(m \pi x) \sin(n \pi x) dx &= 0, & m \neq n \\ \int_{-1}^{1} \cos(m \pi x) \cos(n \pi x) dx &= 1, & m = n \geq 1 \\ \int_{-1}^{1} \sin(m \pi x) \sin(n \pi x) dx &= 1, & m = n \geq 1 \\ \int_{-1}^{1} \cos(m \pi x) \cos(n \pi x) dx &= 2, & m = n = 0 \\ \int_{-1}^{1} \sin(m \pi x) \sin(n \pi x) dx &= 0, & m = n = 0 \\ \int_{-1}^{1} \sin(m \pi x) \cos(n \pi x) dx &= 0 & \end{align} \]
Recall the Taylor series
\[ \begin{equation} f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(x_{0})}{n!} (x-x_{0})^{n} \end{equation} \]
It requires the existence of \(f^{(n)}(x_{0})\) for every \(n\), i.e. \(f\) must be infinitely differentiable at \(x_{0}\), a condition many functions fail to satisfy.
This motivates a series built from trigonometric functions rather than powers: the Fourier sine series
\[ \begin{equation} f(x) = \sum_{n=1}^{\infty} B_{n} \sin(\frac{n \pi x}{L}) \end{equation} \]
Assume this series converges and that \(f(x)\) is an odd function; how should \(B_{n}\) be found? Multiply both sides by \(\sin(\frac{m \pi x}{L})\), integrate, and use orthogonality:
\[ \begin{align} \int_{-L}^{L} f(x) \sin(\frac{m \pi x}{L}) dx &= \int_{-L}^{L} \sum_{n=1}^{\infty} B_{n} \sin(\frac{n \pi x}{L}) \sin(\frac{m \pi x}{L}) dx \\ &= B_{m} L \end{align} \]
Then, the coefficients are obtained as follows:
\[ \begin{equation} B_{m} = \frac{1}{L} \int_{-L}^{L} f(x) \sin(\frac{m \pi x}{L}) dx, \quad m \geq 1 \end{equation} \]
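This coefficient formula can be checked numerically. A sketch, assuming the odd test function \(f(x) = x\) on \([-1, 1]\), whose exact coefficients are \(B_{m} = \frac{2L(-1)^{m+1}}{m\pi}\) (the helper names `f` and `B` are mine):

```python
import math

L = 1.0

def f(x):
    return x  # an odd test function

def B(m, steps=20000):
    # Midpoint-rule approximation of (1/L) * integral_{-L}^{L} f(x) sin(m pi x / L) dx.
    h = 2 * L / steps
    total = 0.0
    for k in range(steps):
        x = -L + (k + 0.5) * h
        total += f(x) * math.sin(m * math.pi * x / L)
    return total * h / L

# Exact coefficients for f(x) = x: B_m = 2 L (-1)^(m+1) / (m pi).
for m in range(1, 5):
    exact = 2 * L * (-1) ** (m + 1) / (m * math.pi)
    assert abs(B(m) - exact) < 1e-6
```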
Similarly, the Fourier cosine series can be expressed as follows:
\[ \begin{equation} f(x) = \sum_{n=0}^{\infty} A_{n} \cos(\frac{n \pi x}{L}) \end{equation} \]
Assume this series converges and that \(f(x)\) is an even function; how should \(A_{n}\) be found? Multiply both sides by \(\cos(\frac{m \pi x}{L})\), integrate, and use orthogonality:
\[ \begin{align} \int_{-L}^{L} f(x) \cos(\frac{m \pi x}{L}) dx &= \int_{-L}^{L} \sum_{n=0}^{\infty} A_{n} \cos(\frac{n \pi x}{L}) \cos(\frac{m \pi x}{L}) dx \\ &= A_{m} L, \quad m \geq 1 \end{align} \]
For \(m = 0\) the integrand on the right is the constant term, giving \(2 A_{0} L\) instead.
Then, the coefficients are obtained as follows:
\[ \begin{align} A_{m} &= \frac{1}{L} \int_{-L}^{L} f(x) \cos(\frac{m \pi x}{L}) dx, & m \geq 1 \\ A_{0} &= \frac{1}{2L} \int_{-L}^{L} f(x) dx & \end{align} \]
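These can be checked the same way. A sketch, assuming the even test function \(f(x) = |x|\) on \([-1, 1]\), whose exact coefficients are \(A_{0} = \frac{1}{2}\) and \(A_{m} = \frac{2((-1)^{m} - 1)}{(m\pi)^{2}}\) (the helper names `f` and `A` are mine):

```python
import math

L = 1.0

def f(x):
    return abs(x)  # an even test function

def A(m, steps=20000):
    # Midpoint-rule approximation of the coefficient integral,
    # with the extra factor 1/2 for the m = 0 case.
    h = 2 * L / steps
    total = 0.0
    for k in range(steps):
        x = -L + (k + 0.5) * h
        total += f(x) * math.cos(m * math.pi * x / L)
    total *= h
    return total / (2 * L) if m == 0 else total / L

# Exact values for f(x) = |x| on [-1, 1].
assert abs(A(0) - 0.5) < 1e-6
for m in range(1, 5):
    exact = 2 * ((-1) ** m - 1) / (m * math.pi) ** 2
    assert abs(A(m) - exact) < 1e-6
```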
We have the following property.
\[ \begin{align} \int_{0}^{\pi} \cos(m t) \cos(n t) dt &= 0, & m \neq n \\ &= \pi / 2, & m = n \neq 0 \\ &= \pi, & m = n = 0 \end{align} \]
Now substitute \(x = \cos t\), so \(dx/dt = -\sin t\), i.e. \(dt = -\frac{1}{\sin t} dx\), with \(\sin t = \sqrt{1 - x^{2}}\) on \([0, \pi]\).
\[ \begin{align} \int_{0}^{\pi} \cos(m t) \cos(n t) dt &= \int_{1}^{-1} T_{m}(x) T_{n}(x) \frac{-1}{\sin t} dx \\ &= \int_{-1}^{1} \frac{T_{m}(x) T_{n}(x)}{\sqrt{1-x^{2}}} dx \\ &= 0, & m \neq n \\ &= \pi / 2, & m = n \neq 0 \\ &= \pi, & m = n = 0 \end{align} \]
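A minimal numerical check of this weighted orthogonality, evaluated in the \(t\) variable where the singular weight disappears (the helper name `weighted_inner` is mine):

```python
import math

def weighted_inner(m, n, steps=20000):
    # integral_{-1}^{1} T_m T_n / sqrt(1 - x^2) dx, computed as
    # integral_0^pi cos(mt) cos(nt) dt via the substitution x = cos t.
    h = math.pi / steps
    total = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        total += math.cos(m * t) * math.cos(n * t)
    return total * h

assert abs(weighted_inner(2, 3)) < 1e-9                  # m != n
assert abs(weighted_inner(4, 4) - math.pi / 2) < 1e-9    # m = n != 0
assert abs(weighted_inner(0, 0) - math.pi) < 1e-9        # m = n = 0
```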
Given a value of \(N\), the function \(T_{N}(x)\) is determined, and with it the set of its zeros \(x_{k}\) and extrema \(\tilde{x}_{k}\):
\[ \begin{align} x_{k} &= \cos(\frac{\pi (k - 1/2)}{N}), & k = 1, 2, ..., N \\ \tilde{x}_{k} &= \cos(\frac{\pi k}{N}), & k = 0, 1, ..., N \end{align} \]
The discrete orthogonality of Chebyshev polynomial is described as follows.
If \(x_{k} (k = 1, 2, ..., N)\) are the \(N\) zeros of \(T_{N}(x)\), and if \(i, j < N\), then
\[ \begin{align} \sum_{k=1}^{N} T_{i}(x_{k}) T_{j}(x_{k}) &= \alpha_{i} \delta_{ij} \\ \alpha_{i} &= \frac{N}{2}, & i \neq 0 \\ &= N, & i = 0 \end{align} \]
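A minimal numerical check of the discrete orthogonality at the zeros, assuming \(N = 8\) (the helper names `T` and `dot` are mine):

```python
import math

def T(n, x):
    return math.cos(n * math.acos(x))

N = 8
# The N zeros of T_N(x).
zeros = [math.cos(math.pi * (k - 0.5) / N) for k in range(1, N + 1)]

def dot(i, j):
    return sum(T(i, x) * T(j, x) for x in zeros)

assert abs(dot(2, 5)) < 1e-12           # i != j
assert abs(dot(3, 3) - N / 2) < 1e-12   # i = j != 0
assert abs(dot(0, 0) - N) < 1e-12       # i = j = 0
```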
If \(\tilde{x}_{k}\) \((k = 0, 1, ..., N)\) are the \(N+1\) extrema of \(T_{N}(x)\), and if \(i, j \leq N\), then (a double prime on the sum means that the first and last terms are halved)
\[ \begin{align} \sum_{k=0}^{N} {}^{\prime\prime} T_{i}(\tilde{x}_{k}) T_{j}(\tilde{x}_{k}) &= \beta_{i} \delta_{ij} \\ \beta_{i} &= \frac{N}{2}, & i \neq 0, N \\ &= N, & i = 0, N \end{align} \]
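A matching numerical check at the extrema, again assuming \(N = 8\) (the helper names `T` and `dot2` are mine):

```python
import math

def T(n, x):
    return math.cos(n * math.acos(x))

N = 8
# The N+1 extrema of T_N(x).
extrema = [math.cos(math.pi * k / N) for k in range(N + 1)]

def dot2(i, j):
    # Double-prime sum: first and last terms taken with weight 1/2.
    total = 0.0
    for k, x in enumerate(extrema):
        w = 0.5 if k in (0, N) else 1.0
        total += w * T(i, x) * T(j, x)
    return total

assert abs(dot2(2, 6)) < 1e-12           # i != j
assert abs(dot2(3, 3) - N / 2) < 1e-12   # 0 < i = j < N
assert abs(dot2(0, 0) - N) < 1e-12       # i = j = 0
```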
The proof of the discrete orthogonality relations can be found in the referenced document.
In the same way as for the Fourier series, the discrete orthogonality relations determine the Chebyshev expansion coefficients, so the Chebyshev expansion of an arbitrary function can be constructed.
An arbitrary function \(f(x)\) can be approximated on the interval \([-1, 1]\) by either of the two formulae
\[ \begin{align} f(x) &= \sum_{j=0}^{N-1} {}^{\prime} a_{j} T_{j}(x) \\ a_{j} &= \frac{2}{N} \sum_{k=1}^{N} f(x_{k}) T_{j}(x_{k}), & j = 0, ..., N-1 \\ f(x) &= \sum_{j=0}^{N} {}^{\prime\prime} b_{j} T_{j}(x) \\ b_{j} &= \frac{2}{N} \sum_{k=0}^{N} {}^{\prime\prime} f(\tilde{x}_{k}) T_{j}(\tilde{x}_{k}), & j = 0, ..., N \end{align} \]
Here a single prime means the \(j = 0\) term is halved, and a double prime means the first and last terms are halved.
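The first formula can be demonstrated end to end. A sketch, assuming the test function \(f(x) = e^{x}\) and \(N = 12\) (the helper names `T`, `cheb_coeffs`, and `cheb_eval` are mine):

```python
import math

def T(n, x):
    return math.cos(n * math.acos(x))

def cheb_coeffs(f, N):
    # Coefficients a_j from the N zeros x_k of T_N, as in the first formula.
    xs = [math.cos(math.pi * (k - 0.5) / N) for k in range(1, N + 1)]
    return [2.0 / N * sum(f(x) * T(j, x) for x in xs) for j in range(N)]

def cheb_eval(a, x):
    # Primed sum: the j = 0 term is halved.
    return a[0] / 2 + sum(a[j] * T(j, x) for j in range(1, len(a)))

f = math.exp
a = cheb_coeffs(f, 12)
# With 12 terms the expansion of exp on [-1, 1] is accurate to ~1e-12.
for x in [-0.9, -0.3, 0.2, 0.8]:
    assert abs(cheb_eval(a, x) - f(x)) < 1e-9
```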